The direct extension of ADMM for three-block separable convex minimization models is convergent when one function is strongly convex
Authors
Abstract
The alternating direction method of multipliers (ADMM) is a benchmark for solving a two-block linearly constrained convex minimization model whose objective function is the sum of two functions without coupled variables. Meanwhile, it is known that convergence is not guaranteed if the ADMM is directly extended to a multiple-block convex minimization model whose objective function has more than two functions. Recently, some authors have actively studied strong convexity conditions on the objective function that suffice to ensure the convergence of the direct extension of ADMM, or the convergence of appropriately twisted variants of the original scheme. However, these strong convexity conditions still seem too strict to be satisfied by some applications for which the direct extension of ADMM works well; and the twisted schemes are less efficient or convenient to implement than the original scheme of the direct extension of ADMM. We are thus motivated to understand why the original scheme of the direct extension of ADMM works for some applications and under which realistic conditions its convergence can be guaranteed. We answer this question for the three-block case, where there are three separable functions in the objective, and show that when one of them is strongly convex, the direct extension of ADMM is convergent. Note that the strong convexity of one function does hold for many applications. We further estimate the worst-case convergence rate measured by the iteration complexity in both the ergodic and nonergodic senses for the direct extension of ADMM, and show that its global linear convergence in an asymptotic sense can be guaranteed under some additional conditions.
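The scheme discussed above can be illustrated on a toy problem. The sketch below (an illustration, not the paper's exact setting or assumptions) applies the direct three-block extension of ADMM to minimizing a sum of three scalar quadratics subject to one coupling linear constraint; here every block happens to be strongly convex, which is more than the one-strongly-convex-block condition the paper requires. The moduli `c`, centers `t`, and penalty `beta` are arbitrary illustrative choices.

```python
# Direct 3-block extension of ADMM on a toy problem:
#   minimize (c1/2)(x1-t1)^2 + (c2/2)(x2-t2)^2 + (c3/2)(x3-t3)^2
#   subject to x1 + x2 + x3 = b
# All data below are illustrative choices; each ci > 0, so every block
# is strongly convex.

c = [1.0, 1.0, 2.0]   # strong-convexity moduli of the three blocks
t = [1.0, 2.0, 3.0]   # centers of the quadratics
b = 3.0               # right-hand side of the coupling constraint
beta = 1.0            # penalty parameter of the augmented Lagrangian

x1 = x2 = x3 = 0.0
lam = 0.0             # Lagrange multiplier

for _ in range(100):
    # Gauss-Seidel sweep: each block minimizes the augmented Lagrangian
    #   L = sum_i (ci/2)(xi-ti)^2 - lam*(x1+x2+x3-b)
    #       + (beta/2)(x1+x2+x3-b)^2
    # with the other two blocks fixed at their most recent values.
    x1 = (c[0]*t[0] + lam - beta*(x2 + x3 - b)) / (c[0] + beta)
    x2 = (c[1]*t[1] + lam - beta*(x1 + x3 - b)) / (c[1] + beta)
    x3 = (c[2]*t[2] + lam - beta*(x1 + x2 - b)) / (c[2] + beta)
    # Dual update on the constraint residual.
    lam -= beta * (x1 + x2 + x3 - b)

# The KKT conditions of this toy problem give xi = ti - mu/ci with
# mu = 1.2, i.e. (x1, x2, x3) ~ (-0.2, 0.8, 2.4).
print(x1, x2, x3, x1 + x2 + x3 - b)
```

For this instance the iteration converges linearly; the point of the paper is that such convergence can fail for general three-block problems, and that strong convexity of (at least) one block is a realistic condition restoring it.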
Similar resources
The direct extension of ADMM for multi-block convex minimization problems is not necessarily convergent
The alternating direction method of multipliers (ADMM) is now widely used in many fields, and its convergence was proved when two blocks of variables are alternatively updated. It is strongly desirable and practically valuable to extend ADMM directly to the case of a multi-block convex minimization problem where its objective function is the sum of more than two separable convex functions. Howe...
A Convergent 3-Block Semi-Proximal ADMM for Convex Minimization Problems with One Strongly Convex Block
In this paper, we present a semi-proximal alternating direction method of multipliers (ADMM) for solving 3-block separable convex minimization problems with the second block in the objective being a strongly convex function and one coupled linear equation constraint. By choosing the semi-proximal terms properly, we establish the global convergence of the proposed semi-proximal ADMM for the step...
Block-wise Alternating Direction Method of Multipliers for Multiple-block Convex Programming and Beyond
The alternating direction method of multipliers (ADMM) is a benchmark for solving a linearly constrained convex minimization model with a two-block separable objective function; and it has been shown that its direct extension to a multiple-block case where the objective function is the sum of more than two functions is not necessarily convergent. For the multipleblock case, a natural idea is to...
On the Proximal Jacobian Decomposition of ALM for Multiple-Block Separable Convex Minimization Problems and Its Relationship to ADMM
The augmented Lagrangian method (ALM) is a benchmark for solving convex minimization problems with linear constraints. When the objective function of the model under consideration is representable as the sum of some functions without coupled variables, a Jacobian or Gauss-Seidel decomposition is often implemented to decompose the ALM subproblems so that the functions’ properties could be used m...
Improving an ADMM-like Splitting Method via Positive-Indefinite Proximal Regularization for Three-Block Separable Convex Minimization
Abstract. The augmented Lagrangian method (ALM) is fundamental for solving convex minimization models with linear constraints. When the objective function is separable, so that it can be represented as the sum of more than one function without coupled variables, various splitting versions of the ALM have been well studied in the literature, such as the alternating direction method of multiplier...
Publication date: 2014